Running an LLM Locally
To run an LLM locally (with an experience similar to ChatGPT) we need two components: Ollama and Open WebUI. Because the process involves both, this file details the full setup for making an LLM work locally.
Ollama gracefully manages downloading, loading/unloading, deleting, and creating LLM models, among other tasks.
Open WebUI is a user interface similar to ChatGPT, but with many more features built in.
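If neither component is installed yet, these are the install commands each project documents (verify against the current docs before running; the Open WebUI example assumes Docker is available):
curl -fsSL https://ollama.com/install.sh | sh
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main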
Change Ollama Models Directory in Ubuntu Linux
1. Get the model files to the new location (two options)
1.1. Copy installed Ollama models from one disk to another
cp -r <current/path/to/models> <destination/to/store/model/files>
- Example command:
cp -r /mnt/INTERNAL/Ollama/.ollama /home/appuser/Ollama/.ollama
1.2. Or copy them from any other (Linux) system on your network, as sketched below
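- A minimal sketch using rsync over SSH; the remote host, user, and source path are placeholders:
rsync -avh <user@other-host>:<path/to/.ollama>/ /home/appuser/Ollama/.ollama/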
2. Stop Ollama Service (optional but recommended)
sudo systemctl stop ollama
3. Get the user and group owner names of your destination folder
Go to the folder where you will keep the Ollama files
cd /home/appuser/Ollama/.ollama (Example)
Check details of all files (the third and fourth columns of the output are the owning user and group)
ls -lk -a
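Alternatively, stat prints just the owning user and group (GNU coreutils; path from the example above):
stat -c '%U %G' /home/appuser/Ollama/.ollama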
4. Create an override config file for the Ollama service
sudo mkdir -p /etc/systemd/system/ollama.service.d; sudo nano /etc/systemd/system/ollama.service.d/override.conf
Add the path, user, and group of the new location in the file
Example
[Service]
# Change the path to your new models path, and the user/group to the owner names from step 3
# (systemd does not allow comments on the same line as a setting, so they are on their own lines here)
Environment="OLLAMA_MODELS=/home/appuser/Ollama/.ollama/models"
User=appuser
Group=appuser
5. Reload the systemd daemon (this is not a system reboot) and restart Ollama
sudo systemctl daemon-reload
sudo systemctl restart ollama
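Optionally, confirm systemd picked up the override by grepping the unit's effective environment for the variable set above:
systemctl show ollama | grep OLLAMA_MODELS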
6. Check that it works (the models copied earlier should appear)
ollama list
7. Try downloading a model and check that it saves to the new location
ollama run <model_name>
- Get new model names from ollama.com/models
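To confirm downloads actually land in the new directory, check its size before and after (path from the earlier example):
du -sh /home/appuser/Ollama/.ollama/models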
Optional checks
Check details of the current directory itself (rather than its contents)
ls -ld -a
Status of Ollama
To see the status of Ollama (whether it is running, stopped, exited, or errored)
sudo systemctl status ollama
Error logs of Ollama
To see the error logs of Ollama (limited to the latest 50 lines)
journalctl -u ollama -n 50
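To follow the log live while reproducing an issue:
journalctl -u ollama -f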
Ollama and Open WebUI on separate computers
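A minimal sketch of the usual approach, assuming Ollama's default port 11434; the environment variables below are the ones the two projects document, but verify against current docs. First, make Ollama listen on all interfaces by extending the same override file as above:
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Then run sudo systemctl daemon-reload and sudo systemctl restart ollama. On the Open WebUI machine, point the UI at the server, e.g. by passing -e OLLAMA_BASE_URL=http://<server-ip>:11434 to the docker run command above.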